🎮 Regulatory Spotlight: AI, UGC & IP in Games — October 2025 Global Update
Prepared by the Word Witch Consulting Regulatory Practice | October 22, 2025
AI Meets Games: Where Law Is Catching Up
As 2025 enters its final quarter, lawmakers are closing the gap between AI-powered creativity and the realities of copyright, platform liability, and user-generated content.
Here’s what’s moving across the top markets that matter most for studios, publishers, and creators.
1. Who Owns AI-Generated Content?
United States
The U.S. Copyright Office has confirmed that prompting alone is not authorship.
In Thaler v. Perlmutter, the D.C. Circuit reaffirmed that only works with human authorship qualify for protection.
Developers using AI in their pipelines should document the human contributions—editing, curation, composition—that add creative expression.
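For teams that want to turn that advice into pipeline tooling, here is a minimal sketch of an append-only log of human creative contributions attached to each AI-assisted asset. The structure and names (HumanContribution, recordContribution) are illustrative assumptions, not a prescribed format; adapt them with counsel.

```typescript
// Hypothetical schema for documenting human creative input on AI-assisted assets.
// Field names are illustrative; adapt to your pipeline and your counsel's guidance.

type ContributionKind = "editing" | "curation" | "composition" | "arrangement";

interface HumanContribution {
  assetId: string;          // internal ID of the AI-assisted asset
  contributor: string;      // named human contributor
  kind: ContributionKind;   // the type of creative expression added
  description: string;      // what was actually changed or selected, in plain language
  timestamp: string;        // ISO 8601, to show the work's development over time
}

// Append-only by convention: entries are never rewritten, so the log can serve
// as contemporaneous evidence of human authorship.
const contributionLog: HumanContribution[] = [];

function recordContribution(entry: HumanContribution): void {
  contributionLog.push({ ...entry });
}

recordContribution({
  assetId: "env-tileset-042",
  contributor: "A. Artist",
  kind: "editing",
  description: "Repainted lighting, removed three AI-generated props, recomposed the layout.",
  timestamp: new Date().toISOString(),
});
```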
European Union
EU law still requires an “author’s own intellectual creation,” which means a human.
But the Text and Data Mining (TDM) exceptions under the DSM Directive let AI developers mine lawfully accessed works unless rights-holders have expressly opted out. Studios scraping datasets in the EU should check for such rights reservations.
United Kingdom
Section 9(3) of the Copyright, Designs and Patents Act keeps its unusual rule for computer-generated works, assigning authorship to “the person by whom the arrangements necessary for the creation of the work are undertaken.”
That provision is under government review, and reform is expected in 2026.
Korea
Korea has drawn a firm line: AI is not an author.
The Ministry of Culture, Sports and Tourism and the Korea Copyright Commission issued a June 30, 2025 Guide clarifying how to register human-authored works that used GenAI.
Expect registration forms to request disclosure of AI tools and human editing steps.
China
Recent court decisions—Beijing Internet Court (2023) and Changshu People’s Court (2025)—recognize copyright in AI-assisted works where the user made creative choices and iterative edits.
This stands in contrast to the U.S. position and could shape how global publishers structure cross-border content pipelines.
Japan & Singapore
Both jurisdictions focus on data-mining exceptions rather than authorship.
Japan allows text-and-data mining for analysis under Article 30-4, while Singapore’s 2021 Copyright Act lets commercial users perform “computational data analysis” with lawful access.
That makes both hubs relatively friendly for training AI models on locally sourced data.
2. Platform Rules: The DSA Era and Beyond
European Union
The Digital Services Act (DSA) now applies to online games and storefronts that host UGC.
Key duties include:
Notice-and-action systems and “statements of reasons” for moderation
Ad transparency and a ban on profiling-based advertising to minors
Recommender-system controls and data access for vetted researchers
Large services—so-called VLOPs with over 45 million EU users—face direct supervision by the European Commission.
For studios, that means harmonizing moderation, transparency, and child-safety workflows across territories.
United Kingdom
The Online Safety Act (OSA) squarely covers online games.
Ofcom’s October 2025 guidance requires child-safety risk assessments, age-assurance measures, and clear recommender-system controls.
Compliance deadlines began phasing in this summer, with enforcement ramping through 2026.
Australia
The Online Safety Act 2021 now features enforceable industry codes, including age-assurance and anti-harassment obligations.
Its reach is extraterritorial—if you have Australian players, you’re covered.
3. The New AI Regulatory Map
European Union – AI Act
In force since August 2024, the AI Act phases in its obligations through 2027.
By August 2025, General-Purpose AI (GPAI) developers must publish training-data summaries and show copyright-law compliance.
If your game includes generative-AI tools, expect to disclose data-handling practices and secure warranties from upstream model providers.
Korea – AI Basic Act
Promulgated January 2025, effective January 2026, Korea’s AI Basic Act establishes national governance, transparency, and safety standards.
It may soon require foreign AI providers to appoint a Korean representative and disclose training data for generative tools.
Game studios integrating AI locally should monitor MCST and MSIT decrees over the next year.
United States – State Action
Federal AI legislation remains fragmented, but Colorado’s AI Act (SB 24-205) sets a precedent: a risk-based framework for “high-risk” AI systems, effective February 2026.
California follows with deepfake and watermarking bills targeting transparency and child protection.
China – Service-Side Controls
The CAC’s Interim Measures for Generative AI Services (2023) and Deep Synthesis Provisions (2022) require content moderation, disclosure and labeling of synthetic content, and security assessments.
Non-compliant foreign services risk blocking—a real concern for in-game generative features.
4. What This Means for Studios, Publishers & Platforms
Audit your UGC policies.
Require creators to disclose when AI tools are used, clarify ownership of AI-assisted works, and update warranties and indemnities accordingly.
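As a rough illustration of how that disclosure can be enforced at the point of upload, the sketch below models a hypothetical UGC submission payload with a mandatory AI-use declaration. The field names and validation rule are assumptions for illustration, not a compliance standard.

```typescript
// Hypothetical UGC submission payload with mandatory AI-use disclosure.
// Names (UgcSubmission, validateSubmission) are illustrative assumptions.

interface UgcSubmission {
  creatorId: string;
  title: string;
  usedAiTools: boolean;           // creator must answer explicitly, not by default
  aiToolsDisclosed: string[];     // e.g. ["image model (third-party)", "in-house LLM"]
  ownershipAccepted: boolean;     // creator accepts the updated ownership terms
  warrantyAccepted: boolean;      // creator accepts updated warranties and indemnities
}

function validateSubmission(s: UgcSubmission): string[] {
  const errors: string[] = [];
  if (s.usedAiTools && s.aiToolsDisclosed.length === 0) {
    errors.push("AI use declared but no tools listed.");
  }
  if (!s.ownershipAccepted || !s.warrantyAccepted) {
    errors.push("Updated ownership and warranty terms must be accepted before upload.");
  }
  return errors;
}

// Example: a submission that declares AI use and lists the tools involved.
console.log(validateSubmission({
  creatorId: "user-123",
  title: "Neon Alley Skin Pack",
  usedAiTools: true,
  aiToolsDisclosed: ["image model (third-party)"],
  ownershipAccepted: true,
  warrantyAccepted: true,
}));
```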
Align moderation workflows.
EU and UK laws now require detailed notices, appeals, and transparency logs.
Design systems that generate compliant “statements of reasons” automatically.
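By way of example, the sketch below models a moderation decision record that captures the kinds of elements a DSA-style statement of reasons calls for: the facts relied on, the legal or contractual ground, whether automated means were involved, and the redress available. The structure and field names are assumptions for illustration, not the regulation's official schema.

```typescript
// Hypothetical "statement of reasons" record emitted automatically with each
// moderation action. Field names are illustrative, not an official DSA schema.

type Restriction =
  | "content_removed"
  | "visibility_limited"
  | "account_suspended"
  | "monetization_restricted";

interface StatementOfReasons {
  decisionId: string;
  restriction: Restriction;
  factsAndCircumstances: string;   // what the content was and why it was actioned
  legalGround?: string;            // statutory basis, if the content is claimed illegal
  termsGround?: string;            // the specific ToS or community-guideline clause relied on
  automatedDetection: boolean;     // whether automated means flagged the content
  automatedDecision: boolean;      // whether the decision itself was automated
  redress: string[];               // e.g. internal appeal, out-of-court dispute settlement
  issuedAt: string;                // ISO 8601 timestamp
}

function buildStatement(
  decisionId: string,
  restriction: Restriction,
  facts: string,
  termsGround: string,
  automatedDetection: boolean
): StatementOfReasons {
  return {
    decisionId,
    restriction,
    factsAndCircumstances: facts,
    termsGround,
    automatedDetection,
    automatedDecision: false, // this sketch assumes a human reviewer confirms the action
    redress: ["internal appeal", "out-of-court dispute settlement", "judicial redress"],
    issuedAt: new Date().toISOString(),
  };
}

// The same record can be shown to the affected user and logged for transparency reporting.
const sor = buildStatement(
  "dec-8891",
  "content_removed",
  "User-uploaded level description contained targeted harassment.",
  "Community Guidelines, harassment clause",
  true
);
console.log(JSON.stringify(sor, null, 2));
```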
Track your models.
For EU deployments, the GPAI training-data-summary obligations took effect in August 2025; confirm that your upstream model providers have published theirs, or prepare your own if you train models in-house.
In the U.S., identify any “high-risk” AI uses—such as moderation or player-behavior scoring—and document risk-management steps.
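One way to keep that documentation in a single place is a lightweight internal model registry. The sketch below is an assumed structure, with illustrative names, that ties each deployed model to its provider disclosures and any high-risk assessment.

```typescript
// Hypothetical internal AI-model registry entry. All names are illustrative.

interface ModelRegistryEntry {
  modelName: string;
  provider: string;                    // upstream vendor, or "in-house"
  trainingDataSummaryUrl?: string;     // link to the provider's published summary, if any
  copyrightPolicyConfirmed: boolean;   // confirmation or warranty obtained from the provider
  uses: string[];                      // where the model is deployed in the game or platform
  highRisk: boolean;                   // e.g. moderation decisions, player-behavior scoring
  riskAssessmentDoc?: string;          // path to the internal risk-management write-up
  lastReviewed: string;                // ISO 8601 date of the last compliance review
}

const registry: ModelRegistryEntry[] = [
  {
    modelName: "chat-moderation-v3",
    provider: "UpstreamAI (hypothetical)",
    trainingDataSummaryUrl: "https://example.com/training-summary", // placeholder
    copyrightPolicyConfirmed: true,
    uses: ["in-game chat moderation"],
    highRisk: true,
    riskAssessmentDoc: "compliance/chat-moderation-v3-risk.md",
    lastReviewed: "2025-10-01",
  },
];

// Flag entries that need attention before the next review cycle.
const needsAction = registry.filter(
  (m) => m.highRisk && (!m.riskAssessmentDoc || !m.copyrightPolicyConfirmed)
);
console.log(`${needsAction.length} high-risk model(s) missing documentation.`);
```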
Localize for Korea.
Follow MCST’s AI-copyright guides and prepare for the AI Basic Act rollout in 2026.
Label AI-generated assets clearly and maintain human-authorship records for registration.
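A minimal sketch of asset-level labeling follows, assuming a hypothetical metadata shape; the fields shown are illustrative and would need to track whatever MCST's registration forms ultimately require.

```typescript
// Hypothetical asset metadata for labeling AI-generated content and keeping the
// human-authorship trail needed for registration. Names are illustrative.

interface AssetLabel {
  assetId: string;
  aiGenerated: boolean;          // any generative-AI involvement at all
  aiTools: string[];             // tools used, for registration disclosure
  humanAuthors: string[];        // named humans whose contributions are claimed
  contributionLogRef?: string;   // pointer to the per-asset human-contribution log
  displayLabel?: string;         // player-facing label, e.g. "Contains AI-generated art"
}

function labelFor(asset: AssetLabel): string | undefined {
  // Only surface a player-facing label when AI was actually involved.
  return asset.aiGenerated ? asset.displayLabel ?? "Contains AI-generated content" : undefined;
}

console.log(labelFor({
  assetId: "npc-portrait-117",
  aiGenerated: true,
  aiTools: ["image model (third-party)"],
  humanAuthors: ["B. Illustrator"],
  contributionLogRef: "provenance/npc-portrait-117.json",
}));
```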
5. The Bottom Line
The 2025 legal landscape confirms what many in the games industry suspected:
AI isn’t replacing creativity; it is reshaping how authorship, platform responsibility, and player safety are defined.
Studios that treat compliance as part of their creative pipeline—not an afterthought—will be best positioned to innovate safely.
References (official & law-firm sources)
U.S. Copyright Office AI Reports (2023-25) | Thaler v. Perlmutter, D.C. Cir. 2025 | EU Digital Services Act (2022) | EU AI Act (2024) | UK Online Safety Act (2023) & Ofcom Guidance (2025) | Korea AI Basic Act (2025) and MCST/KCC Guides (2025) | Colorado AI Act (2024) | China CAC Generative AI Measures (2023).
This newsletter is provided for informational purposes only and does not constitute legal advice. For tailored guidance on implementing these changes within your studio or platform, contact the Word Witch Consulting regulatory practice.